    A sampling algorithm to estimate the effect of fluctuations in particle physics data

    Background properties in experimental particle physics are typically estimated using large data sets. However, different events can exhibit different features because of the quantum mechanical nature of the underlying physics processes. While signal and background fractions in a given data set can be evaluated using a maximum likelihood estimator, the shapes of the corresponding distributions are traditionally obtained from high-statistics control samples, which normally neglects the effect of fluctuations. On the other hand, if it were possible to subtract background using templates that take fluctuations into account, this would be expected to improve the resolution of the observables of interest and, depending on the analysis, to reduce systematic uncertainties. This study is an initial step in this direction. We propose a novel algorithm inspired by the Gibbs sampler that makes it possible to estimate the shapes of signal and background probability density functions from a given collection of particles, using control sample templates as initial conditions and refining them to take into account the effect of fluctuations. Results on Monte Carlo data are presented, and the prospects for future development are discussed.
    Comment: 6 pages, 1 figure. Edited to improve readability in line with the published article. This is based on a condensed version for publication in the Proceedings of the International Conference on Mathematical Modelling in the Physical Sciences, IC-MSQUARE 2012, Budapest, Hungary. A more detailed discussion can be found in the preceding version of this arXiv record.
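
    The sketch below illustrates, under stated assumptions, the kind of Gibbs-sampler-style alternation described in this abstract: per-particle signal/background labels are drawn from their conditional probabilities, and the binned shapes are then re-estimated from the current labelling, starting from control-sample templates. The observable, binning, blending weight and iteration count are illustrative choices, not the published algorithm.

        # Minimal sketch of a Gibbs-sampler-style refinement of signal and background
        # templates from a single collection of values. Illustrative only: the
        # observable, binning, blending weight and iteration count are assumptions.
        import numpy as np

        rng = np.random.default_rng(42)

        def refine_templates(x, f_sig, sig_template, bkg_template, edges,
                             n_iter=200, blend=0.5):
            """Alternate between sampling per-particle labels and re-estimating
            the binned signal/background shapes, starting from control-sample
            templates (normalised histograms over `edges`)."""
            sig, bkg = sig_template.copy(), bkg_template.copy()
            idx = np.clip(np.digitize(x, edges) - 1, 0, len(edges) - 2)
            for _ in range(n_iter):
                # Conditional probability of the signal hypothesis for each particle
                p_sig = f_sig * sig[idx]
                p_bkg = (1.0 - f_sig) * bkg[idx]
                prob = p_sig / (p_sig + p_bkg + 1e-12)
                labels = rng.random(len(x)) < prob          # Gibbs-style label draw
                # Re-estimate the shapes from the current labelling, blended with
                # the control-sample templates to stabilise low-statistics bins
                new_sig, _ = np.histogram(x[labels], bins=edges, density=True)
                new_bkg, _ = np.histogram(x[~labels], bins=edges, density=True)
                sig = blend * new_sig + (1.0 - blend) * sig_template
                bkg = blend * new_bkg + (1.0 - blend) * bkg_template
            return sig, bkg

        # Toy usage: Gaussian signal over an exponential background
        edges = np.linspace(0.0, 5.0, 41)
        x = np.concatenate([rng.normal(2.5, 0.3, 300), rng.exponential(1.5, 700)])
        sig0, _ = np.histogram(rng.normal(2.5, 0.35, 100_000), bins=edges, density=True)
        bkg0, _ = np.histogram(rng.exponential(1.4, 100_000), bins=edges, density=True)
        sig_hat, bkg_hat = refine_templates(x, 0.3, sig0, bkg0, edges)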

    Toward particle-level filtering of individual collision events at the Large Hadron Collider and beyond

    Low-energy strong interactions are a major source of background at hadron colliders, and methods of subtracting the associated energy flow are well established in the field. Traditional approaches treat the contamination as diffuse, and estimate background energy levels either by averaging over large data sets or by restricting to given kinematic regions inside individual collision events. On the other hand, more recent techniques take into account the discrete nature of background, most notably by exploiting the presence of substructure inside hard jets, i.e. inside collections of particles originating from scattered hard quarks and gluons. However, none of the existing methods subtract background at the level of individual particles inside events. We illustrate the use of an algorithm that will allow particle-by-particle background discrimination at the Large Hadron Collider, and we envisage this as the basis for a novel event filtering procedure upstream of the official reconstruction chains. Our hope is that this new technique will improve physics analysis when used in combination with state-of-the-art algorithms in high-luminosity hadron collider environments.
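
    As a rough illustration of the particle-level filtering idea outlined above, the following sketch assigns each particle a background probability from assumed kinematic templates and passes only the surviving particles downstream. The discriminating variable, template shapes, background fraction and threshold are illustrative assumptions, not the algorithm referred to in the abstract.

        # Minimal sketch of a particle-by-particle event filter applied upstream of
        # reconstruction. All templates, fractions and thresholds are assumptions.
        import numpy as np

        def background_probability(pt, f_bkg, bkg_pdf, hard_pdf):
            """Posterior probability that a particle with transverse momentum `pt`
            originates from the soft background, given two template densities and
            the expected background particle fraction `f_bkg`."""
            b = f_bkg * bkg_pdf(pt)
            h = (1.0 - f_bkg) * hard_pdf(pt)
            return b / (b + h + 1e-12)

        def filter_event(particles, f_bkg, bkg_pdf, hard_pdf, threshold=0.9):
            """Keep the particles whose background probability is below `threshold`;
            the survivors would then be passed to jet clustering and the rest of
            the reconstruction chain."""
            pt = np.array([p["pt"] for p in particles])
            prob = background_probability(pt, f_bkg, bkg_pdf, hard_pdf)
            return [p for p, pr in zip(particles, prob) if pr < threshold]

        # Toy usage: the soft background falls more steeply in pT than the hard scatter
        rng = np.random.default_rng(0)
        event = [{"pt": float(v)} for v in rng.exponential(0.7, 400)] \
              + [{"pt": float(v)} for v in rng.exponential(5.0, 60)]
        bkg_pdf = lambda pt: np.exp(-pt / 0.7) / 0.7
        hard_pdf = lambda pt: np.exp(-pt / 5.0) / 5.0
        kept = filter_event(event, f_bkg=400 / 460, bkg_pdf=bkg_pdf, hard_pdf=hard_pdf)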

    Toward the estimation of background fluctuations under newly-observed signals in particle physics

    When the number of events associated with a signal process is estimated in particle physics, it is common practice to extrapolate background distributions from control regions to a predefined signal window. This allows accurate estimation of the expected, or average, number of background events under the signal. However, in general, the actual number of background events can deviate from the average due to fluctuations in the data. Such a difference can be sizable when compared to the number of signal events in the early stages of data analysis following the observation of a new particle, as well as in the analysis of rare decay channels. We report on the development of a data-driven technique that aims to estimate the actual, as opposed to the expected, number of background events in a predefined signal window. We discuss results on toy Monte Carlo data and provide a preliminary estimate of systematic uncertainty.
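
    The toy sketch below illustrates the distinction this abstract draws between the expected and the actual number of background events inside a predefined signal window; the model (exponential background, Gaussian signal, window boundaries) is an assumption used only to set up the comparison, not the estimation technique itself.

        # Toy comparison of expected vs. actual background counts in a signal window.
        import numpy as np

        rng = np.random.default_rng(1)
        lam, n_bkg, n_sig = 2.0, 5000, 60
        window = (2.9, 3.3)                      # predefined signal window (toy units)

        bkg = rng.exponential(lam, n_bkg)        # background pseudo-data
        sig = rng.normal(3.1, 0.05, n_sig)       # signal pseudo-data
        data = np.concatenate([bkg, sig])

        # Expected background in the window from the (here, known) background shape:
        # integral of the exponential density over the window times the background yield
        expected = n_bkg * (np.exp(-window[0] / lam) - np.exp(-window[1] / lam))

        # Actual background count in the window for this particular pseudo-experiment
        actual = np.count_nonzero((bkg > window[0]) & (bkg < window[1]))

        print(f"expected background in window: {expected:.1f}")
        print(f"actual background in window:   {actual}")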

    On the frequency distribution of neutral particles from low-energy strong interactions

    The rejection of the contamination, or background, from low-energy strong interactions at hadron collider experiments is a topic that has received significant attention in the field of particle physics. This article builds on a particle-level view of collision events, in line with recently proposed subtraction methods. While conventional techniques in the field usually concentrate on probability distributions, our study is, to our knowledge, the first attempt at estimating the frequency distribution of background particles across the kinematic space inside individual collision events. In fact, while the probability distribution can generally be estimated given a model of low-energy strong interactions, the corresponding frequency distribution inside a single event typically deviates from the average and cannot be predicted a priori. We present preliminary results in this direction, and establish a connection between our technique and the particle weighting methods that have been the subject of recent investigation at the Large Hadron Collider.
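
    The following sketch contrasts, for a toy model, the event-averaged distribution of background particles with the frequency distribution observed in a single event, which is the per-event quantity discussed above. The multiplicity model, the pseudorapidity distribution and the binning are illustrative assumptions.

        # Event-averaged vs. single-event binned counts of background particles.
        import numpy as np

        rng = np.random.default_rng(2)
        edges = np.linspace(-2.5, 2.5, 11)       # pseudorapidity bins (assumed)
        n_events = 2000

        def toy_event():
            n = rng.poisson(80)                  # background multiplicity (assumed)
            return rng.uniform(-2.5, 2.5, n)     # roughly flat in pseudorapidity

        counts = np.array([np.histogram(toy_event(), bins=edges)[0]
                           for _ in range(n_events)])
        average = counts.mean(axis=0)            # event-averaged expectation
        single = counts[0]                       # frequency distribution in one event

        print("average counts per bin:", np.round(average, 1))
        print("counts in one event:   ", single)
        print("relative deviation:    ", np.round((single - average) / average, 2))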

    Data-driven estimation of neutral pileup particle multiplicity in high-luminosity hadron collider environments

    The upcoming operation regimes of the Large Hadron Collider are going to place stronger requirements on the rejection of particles originating from pileup, i.e. from interactions between other protons. For this reason, particle weighting techniques have recently been proposed in order to subtract pileup at the level of individual particles. We describe a choice of weights that, unlike others that rely on particle proximity, exploits the particle-level kinematic signatures of the high-energy scattering and of the pileup interactions. We illustrate the use of the weights to estimate the number density of neutral pileup particles inside individual events, and we elaborate on the complementarity between ours and other methods. We conclude by suggesting the idea of combining different sets of weights with a view to exploiting different features of the underlying processes for improved pileup subtraction at higher luminosity.
    High Energy Physics Group at Brunel University London
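
    As a rough sketch of the kinematics-based weighting described above, the example below assigns each neutral particle the posterior probability of the pileup hypothesis from assumed transverse-momentum templates and a pileup fraction taken to be known from charged particles, then sums the weights to estimate the neutral pileup multiplicity. The templates and all numbers are illustrative assumptions rather than the published choice of weights.

        # Estimating the neutral pileup particle multiplicity from per-particle weights.
        import numpy as np

        rng = np.random.default_rng(3)

        # Toy event: neutral particles from pileup (steeply falling pT) and hard scatter
        pt_pileup = rng.exponential(0.6, 250)
        pt_hard = rng.exponential(4.0, 40)
        pt_neutral = np.concatenate([pt_pileup, pt_hard])

        # Pileup particle fraction, assumed here to be fixed from charged particles
        f_pu = 0.85

        # Kinematic templates (normalised pT densities) for the two hypotheses
        pu_pdf = lambda pt: np.exp(-pt / 0.6) / 0.6
        hs_pdf = lambda pt: np.exp(-pt / 4.0) / 4.0

        # Per-particle weight: posterior probability of the pileup hypothesis
        num = f_pu * pu_pdf(pt_neutral)
        w_pileup = num / (num + (1.0 - f_pu) * hs_pdf(pt_neutral))

        print(f"true neutral pileup multiplicity:      {len(pt_pileup)}")
        print(f"estimated neutral pileup multiplicity: {w_pileup.sum():.1f}")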

    Predictive factors for hepatocellular carcinoma recurrence after curative treatments

    Hepatocellular carcinoma (HCC) is the fifth most common neoplasm worldwide. Recurrence of HCC after resection or loco-regional therapies represents an important clinical issue, as it affects up to 70% of patients. Recurrence can be classified as early or late, depending on whether it occurs within or after 24 months of treatment. While the predictive factors for early recurrence are mainly related to tumour biology (local invasion and intrahepatic metastases), late recurrences are mainly related to de novo tumour formation. It is therefore important to recognize these factors in each patient prior to any treatment, in order to optimize the treatment strategy and follow-up after treatment. The aim of this review is to summarize the current evidence available regarding predictive factors for the recurrence of HCC, according to the different therapeutic strategies available. In particular, we will discuss the role of new ultrasound-based techniques and biological features, such as tumour-related and circulating biomarkers, in predicting HCC recurrence. Recent advances in imaging-related parameters in computed tomography scans and magnetic resonance imaging will also be discussed.

    Are the Expanded Baveno VI Criteria really safe to screen compensated cirrhotic patients for high-risk varices?

    The Expanded Baveno VI criteria [1] have recently been proposed as a new screening strategy for high-risk varices (HRV), able to increase the rate of spared upper endoscopies (EGDs) and improve upon the original Baveno VI criteria [2]. To date, few studies have investigated the performance and safety of these criteria [3,4]. The recent work by Bae et al. [4] is the first to report a high rate (>5%) of missed HRV by the expanded criteria, questioning their efficiency in safely ruling out HRV (sensitivity 81%, NPV 93%, LR- 0.30).

    Clinical presentation of celiac disease and diagnosis accuracy in a single-center European pediatric cohort over 10 years

    (1) Background: Changes in the clinical presentation of celiac disease (CD) in children have been reported. The guidelines of the European Society for Paediatric Gastroenterology, Hepatology and Nutrition (ESPGHAN) allow esophagogastroduodenoscopy (EGD) with biopsies to be avoided under specific circumstances. We aimed to assess the clinical picture of pediatric CD patients at diagnosis and to validate the ESPGHAN non-biopsy criteria. (2) Methods: Patients with suspected CD or undergoing screening from 2004 to 2014 at the University Hospital in Modena, Italy, were enrolled. The accuracy of the ESPGHAN non-biopsy criteria and of modified versions was assessed. (3) Results: In total, 410 patients were enrolled, of whom 403 were considered for analysis. Of these, 45 (11.2%) were asymptomatic and diagnosed with CD, while 358 patients (88.2%) were symptomatic, of whom 295 were diagnosed with CD. Among symptomatic CD patients, 57 (19.3%) had gastrointestinal symptoms, 98 (33%) had atypical symptoms and 140 (47.4%) had both. No difference was found in the presence of gastrointestinal symptoms at different ages. The non-biopsy ESPGHAN criteria yielded an accuracy of 59.4% with a positive predictive value (PPV) of 100%; 173 out of 308 EGDs (56.2%) could have been avoided. The modified 7× and 5× upper-limit-of-normal cut-offs for IgA anti-tissue transglutaminase would have allowed 60.7% and 64.3% of EGDs to be avoided, respectively. (4) Conclusions: Over 10 years, a later age at diagnosis and increased rates of atypical CD presentation were found. The ESPGHAN non-biopsy criteria are accurate for CD diagnosis and allow half of unneeded EGDs to be avoided. Modified versions allowed a greater number of EGDs to be spared.
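
    The short sketch below shows how figures like the "EGDs avoided" percentages quoted above can be computed from per-patient serology for a given cut-off expressed in multiples of the upper limit of normal (ULN) of IgA anti-tissue transglutaminase. The synthetic values and the simplified rule (antibody level alone, ignoring the additional ESPGHAN requirements such as anti-endomysial antibody positivity) are assumptions for illustration only.

        # Counting how many endoscopies a serology cut-off would spare (synthetic data).
        import numpy as np

        rng = np.random.default_rng(4)
        uln = 10.0                                           # assay ULN (arbitrary units)
        tta = rng.lognormal(mean=4.8, sigma=1.5, size=308)   # anti-tTG of biopsied patients

        def egd_avoided(tta_values, uln, multiple):
            """Number and fraction of endoscopies spared by a `multiple` x ULN cut-off."""
            spared = np.count_nonzero(tta_values >= multiple * uln)
            return spared, spared / len(tta_values)

        for multiple in (10, 7, 5):
            n, frac = egd_avoided(tta, uln, multiple)
            print(f"{multiple}x ULN: {n} EGDs spared ({frac:.1%})")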

    Creatine Supplementation to Improve Sarcopenia in Chronic Liver Disease: Facts and Perspectives

    Creatine supplementation is one of the most widely studied and useful forms of ergogenic nutritional support for athletes, used to improve performance, strength and muscle mass. Over time, creatine has also shown beneficial effects in several human disease conditions. This review aims to summarise the current evidence for creatine supplementation in advanced chronic liver disease and its complications, primarily in sarcopenic cirrhotic patients, because this condition is known to be associated with poor prognosis and outcomes. Although creatine supplementation in chronic liver disease has barely been investigated and has not been studied in human patients, its potential efficacy is indirectly suggested by animal models of non-alcoholic fatty liver disease, in which it has beneficial effects on the fatty liver. Beneficial effects have similarly been suggested for hepatic encephalopathy and fatigue. Creatine supplementation has demonstrated benefits for sarcopenia in the elderly, with and without resistance training, suggesting a potential role in improving this condition in patients with advanced chronic liver disease. Creatine supplementation could therefore address several critical aspects of chronic liver disease and its complications. Further studies are needed to evaluate this hypothesis in the clinical setting.